# Multiple Sizes Available
OpenELM 450M Instruct
OpenELM is a family of open-source efficient language models that use a layer-wise scaling strategy to allocate parameters efficiently across Transformer layers, with both pre-trained and instruction-tuned variants ranging from 270 million to 3 billion parameters (a loading sketch follows this card).
Large Language Model
Transformers

apple · 114.41k · 47
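
As a rough illustration of how an instruction-tuned OpenELM checkpoint is typically loaded through the Transformers library; the Hub repo ID `apple/OpenELM-450M-Instruct` and the LLaMA-style tokenizer are assumptions taken from this card, not verified details.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo ID assumed from the card title; OpenELM checkpoints ship custom
# modeling code, so trust_remote_code=True is required to load them.
repo_id = "apple/OpenELM-450M-Instruct"
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# OpenELM does not bundle its own tokenizer; a LLaMA-compatible tokenizer
# is assumed here. Substitute whichever tokenizer the model card specifies.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

prompt = "Once upon a time there was"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```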
RoBERTa Base Word Chinese CLUECorpusSmall
A word-level Chinese RoBERTa base model pre-trained on the CLUECorpusSmall corpus; word segmentation shortens input sequences and makes them more efficient to process (a fill-mask sketch follows this card).
Large Language Model · Chinese
uer · 184 · 9
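
A minimal sketch of querying a word-level Chinese RoBERTa checkpoint through the fill-mask pipeline; the repo ID `uer/roberta-base-word-chinese-cluecorpussmall` is inferred from the card title, and the tokenizer and mask token are taken from whatever config the repo actually hosts.

```python
from transformers import pipeline

# Repo ID inferred from the card title (assumption, not verified here).
fill_mask = pipeline(
    "fill-mask",
    model="uer/roberta-base-word-chinese-cluecorpussmall",
)

# The hosted config decides the tokenizer, so read the mask token from it.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"北京是中国的{mask}。"):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Because the model is word-segmented rather than character-segmented, the predicted tokens are whole words, which is also why input sequences end up shorter than with character-level Chinese models.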
PTT5 Large T5 Vocab
MIT
PTT5 is a T5 model pretrained on the BrWaC corpus and optimized for Portuguese, offered in multiple sizes and with a choice of vocabularies (a loading sketch follows this card).
Large Language Model
Transformers · Other

unicamp-dl · 45 · 2
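
A hedged sketch of loading a PTT5 checkpoint and exercising its pretrained span-denoising objective; the repo ID `unicamp-dl/ptt5-large-t5-vocab` is assumed from the card title, and the example sentence is purely illustrative.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Repo ID assumed from the card title; per the card, PTT5 is offered with
# different vocabularies, and the "t5-vocab" suffix is taken here to mean
# the original T5 sentencepiece vocabulary (assumption).
repo_id = "unicamp-dl/ptt5-large-t5-vocab"
tokenizer = T5Tokenizer.from_pretrained(repo_id)
model = T5ForConditionalGeneration.from_pretrained(repo_id)

# The checkpoint is pretrained only (no task fine-tuning), so filling in a
# sentinel token is a more faithful smoke test than free-form generation.
text = "O PTT5 foi treinado em textos em <extra_id_0> extraídos da web."
input_ids = tokenizer(text, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=False))
```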
RoBERTa Small Word Chinese CLUECorpusSmall
A word-level Chinese RoBERTa small model pretrained on CLUECorpusSmall that outperforms character-level models on multiple downstream tasks.
Large Language Model · Chinese
uer · 33 · 2
Chinese RoBERTa L-12 H-768
A Chinese pre-trained language model based on the RoBERTa architecture, with 12 Transformer layers and a hidden dimension of 768 (a configuration sketch follows this card).
Large Language Model · Chinese
uer · 419 · 13
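
For orientation, a small configuration sketch of what the "L-12, H-768" naming maps to in a BERT/RoBERTa-style model; the layer count and hidden size mirror the card title, while the remaining values (attention heads, feed-forward size, vocabulary) are common defaults used here as assumptions, not values read from the published checkpoint.

```python
from transformers import BertConfig, BertModel

# Illustrative config only: "L-12" = 12 Transformer layers,
# "H-768" = hidden dimension of 768. Other values are common defaults
# (assumptions), not taken from the uer checkpoint itself.
config = BertConfig(
    num_hidden_layers=12,    # L-12
    hidden_size=768,         # H-768
    num_attention_heads=12,
    intermediate_size=3072,
    vocab_size=21128,        # typical Chinese BERT vocabulary size (assumption)
)
model = BertModel(config)
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
```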